Human reliability
Human reliability is related to the field of human factors and ergonomics, and refers to the reliability of humans in fields such as manufacturing, transportation, the military, or medicine. Human performance can be affected by many factors, such as age, state of mind, physical health, attitude, emotions, a propensity for certain common mistakes and errors, and cognitive biases.
Human reliability is very important because of the contribution of humans to the resilience of systems and because of the possible adverse consequences of human errors or oversights, especially when the human is a crucial part of a large socio-technical system, as is common today. User-centered design and error-tolerant design are just two of the many terms used to describe efforts to make technology better suited to operation by humans.
Analysis techniques
A variety of methods exist for human reliability analysis.[1][2] Two general classes of methods are those based on probabilistic risk assessment (PRA) and those based on a cognitive theory of control.
PRA-based techniques
One method for analyzing human reliability is a straightforward extension of probabilistic risk assessment (PRA): in the same way that equipment can fail in a power plant, a human operator can commit errors. In both cases, an analysis (functional decomposition for equipment and task analysis for humans) articulates a level of detail to which failure or error probabilities can be assigned. This basic idea underlies the Technique for Human Error Rate Prediction (THERP).[3] THERP is intended to generate human error probabilities that can be incorporated into a PRA. The Accident Sequence Evaluation Program (ASEP) human reliability procedure is a simplified form of THERP; an associated computational tool is the Simplified Human Error Analysis Code (SHEAN).[4] More recently, the US Nuclear Regulatory Commission published the Standardized Plant Analysis Risk Human Reliability Analysis (SPAR-H) method to take account of the potential for human error.[5][6]
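The following is a minimal, illustrative sketch (in Python) of the PRA-style idea of scaling a nominal human error probability (HEP) by performance shaping factors, as done in methods such as THERP and SPAR-H. The task, the nominal value, the multipliers, and the simple capping rule are assumptions chosen for the example, not values or formulas from the published methods.

```python
# Illustrative sketch: adjusting a nominal human error probability (HEP)
# with performance-shaping-factor (PSF) multipliers, in the spirit of
# PRA-based methods such as THERP and SPAR-H. The nominal HEP and the
# multiplier values below are assumptions for illustration only; consult
# the published method documentation (e.g. NUREG/CR-6883) for real values.

def adjusted_hep(nominal_hep: float, psf_multipliers: list[float]) -> float:
    """Scale a nominal HEP by the product of PSF multipliers,
    capping the result so it remains a valid probability."""
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    raw = nominal_hep * composite
    # Keep the result in [0, 1]; a real method uses a published
    # adjustment formula rather than a simple cap.
    return min(raw, 1.0)

if __name__ == "__main__":
    # Hypothetical diagnosis task: nominal HEP of 1e-2, degraded by
    # high stress (x2) and poor ergonomics (x10), improved by ample time (x0.1).
    print(adjusted_hep(1e-2, [2.0, 10.0, 0.1]))  # -> 0.02
```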
Cognitive control techniques
Erik Hollnagel developed this line of thought in his work on the Contextual Control Model (COCOM)[7] and the Cognitive Reliability and Error Analysis Method (CREAM).[8] COCOM models human performance as a set of control modes: strategic (based on long-term planning), tactical (based on procedures), opportunistic (based on the present context), and scrambled (random); it also proposes a model of how transitions between these control modes occur. The model of control mode transitions involves a number of factors, including the human operator's estimate of the outcome of an action (success or failure), the time remaining to accomplish the action (adequate or inadequate), and the number of simultaneous goals the human operator is pursuing at that time. CREAM is a human reliability analysis method based on COCOM.
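The sketch below illustrates, under simplifying assumptions, the COCOM notion of control modes and transitions between them. The four modes come from the description above, but the transition rule shown (driven only by outcome feedback and time adequacy) is a toy simplification for illustration and does not reproduce Hollnagel's published model.

```python
# Minimal sketch of the COCOM idea of control modes and transitions.
# The transition rule below is an illustrative assumption, not the
# published COCOM model.

from enum import Enum

class ControlMode(Enum):
    STRATEGIC = "strategic"          # based on long-term planning
    TACTICAL = "tactical"            # based on procedures
    OPPORTUNISTIC = "opportunistic"  # driven by the present context
    SCRAMBLED = "scrambled"          # essentially random action choice

# Ordered from least to most organized control.
ORDER = [ControlMode.SCRAMBLED, ControlMode.OPPORTUNISTIC,
         ControlMode.TACTICAL, ControlMode.STRATEGIC]

def next_mode(current: ControlMode, outcome_success: bool,
              time_adequate: bool) -> ControlMode:
    """Toy transition rule: success with adequate time moves control
    'up' toward strategic; failure under time pressure moves it 'down'."""
    i = ORDER.index(current)
    if outcome_success and time_adequate:
        i = min(i + 1, len(ORDER) - 1)
    elif not outcome_success and not time_adequate:
        i = max(i - 1, 0)
    return ORDER[i]

if __name__ == "__main__":
    mode = ControlMode.TACTICAL
    for success, enough_time in [(False, False), (True, True), (True, True)]:
        mode = next_mode(mode, success, enough_time)
        print(mode)  # OPPORTUNISTIC, then TACTICAL, then STRATEGIC
```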
References
- ^ Kirwan & Ainsworth 1992
- ^ Kirwan 1994
- ^ Swain & Guttmann 1983
- ^ Simplified Human Error Analysis Code (Wilson, 1993)
- ^ SPAR-H
- ^ Gertman et al., 2005
Literature
- Gertman, D. L.; Blackman, H. S. (2001). Human reliability and safety analysis data handbook. Wiley.
- Gertman, D.; Blackman, H.; Marble, J.; Byers, J.; Smith, C. (2005). The SPAR-H human reliability analysis method. NUREG/CR-6883. Idaho National Laboratory, prepared for U. S. Nuclear Regulatory Commission.
- M. Cappelli; A.M.Gadomski; M.Sepielli (2011). Human Factors in Nuclear Power Plant Safety Management: A Socio-Cognitive Modeling Approach using TOGA Meta-Theory. Proceedings of International Congress on Advances in Nuclear Power Plants. Nice (FR). SFEN (Société Française d'Energie Nucléaire).
- Hollnagel, E. (1993). Human reliability analysis: Context and control. Academic Press.
- Hollnagel, E. (1998). Cognitive reliability and error analysis method: CREAM. Elsevier.
- Hollnagel, E.; Amalberti, R. (2001). The Emperor's New Clothes, or whatever happened to "human error"? Invited keynote presentation at 4th International Workshop on Human Error, Safety and System Development. Linköping, June 11–12, 2001.
- Hollnagel, E.; Woods, D. D., eds. (2006). Resilience engineering: Concepts and precepts. Ashgate.
- Jones, P. M. (1999). Human error and its amelioration. In Handbook of Systems Engineering and Management (A. P. Sage and W. B. Rouse, eds.), 687-702. Wiley.
- Kirwan, B. (1994). A Guide to Practical Human Reliability Assessment. Taylor & Francis.
- Kirwan, B.; Ainsworth, L., eds. (1992). A guide to task analysis. Taylor & Francis.
- Norman, D. (1988). The psychology of everyday things. Basic Books.
- Reason, J. (1990). Human error. Cambridge University Press.
- Roth, E.; et al. (1994). An empirical investigation of operator performance in cognitively demanding simulated emergencies. NUREG/CR-6208, Westinghouse Science and Technology Center. Report prepared for Nuclear Regulatory Commission.
- Sage, A. P. (1992). Systems engineering. Wiley.
- Senders, J.; Moray, N. (1991). Human error: Cause, prediction, and reduction. Lawrence Erlbaum Associates.
- Shappell, S.; Wiegmann, D. (2000). The human factors analysis and classification system - HFACS. DOT/FAA/AM-00/7, Office of Aviation Medicine, Federal Aviation Administration, Department of Transportation.
- Swain, A. D.; Guttmann, H. E. (1983). Handbook of human reliability analysis with emphasis on nuclear power plant applications. NUREG/CR-1278 (Washington D.C.).
- Wallace, B.; Ross, A. (2006). Beyond human error. CRC Press.
- Wiegmann, D.; Shappell, S. (2003). A human error approach to aviation accident analysis: The human factors analysis and classification system. Ashgate.
- Wilson, J.R. (1993). SHEAN (Simplified Human Error Analysis code) and automated THERP. United States Department of Energy Technical Report Number WINCO--11908.
- Woods, D. D. (1990). Modeling and predicting human error. In J. Elkind, S. Card, J. Hochberg, and B. Huey (Eds.), Human performance models for computer-aided engineering (248-274). Academic Press.
- Federal Aviation Administration. 2009 electronic code of regulations. Retrieved September 25, 2009, from https://web.archive.org/web/20120206214308/http://www.airweb.faa.gov/Regulatory_and_Guidance_library/rgMakeModel.nsf/0/5a9adccea6c0c4e286256d3900494a77/$FILE/H3WE.pdf
- Autrey, T.D. (2015). 6-Hour Safety Culture: How to Sustainably Reduce Human Error and Risk (and do what training alone can't possibly do). Human Performance Association.
- Davies, J.B.; Ross, A.; Wallace, B.; Wright, L. (2003). Safety Management: a Qualitative Systems Approach. Taylor and Francis.
- Dekker, S.W.A. (2005). Ten Questions About Human Error: a new view of human factors and systems safety. Lawrence Erlbaum Associates.
- Dekker, S.W.A. (2006). The Field Guide to Understanding Human Error. Ashgate.
- Dekker, S.W.A. (2007). Just Culture: Balancing Safety and Accountability. Ashgate.
- Dismukes, R. K.; Berman, B. A.; Loukopoulos, L. D. (2007). The limits of expertise: Rethinking pilot error and the causes of airline accidents. Ashgate.
- Forester, J.; Kolaczkowski, A.; Lois, E.; Kelly, D. (2006). Evaluation of human reliability analysis methods against good practices. NUREG-1842 Final Report. U. S. Nuclear Regulatory Commission.
- Goodstein, L. P.; Andersen, H. B.; Olsen, S. E. (1988). Tasks, errors, and mental models. Taylor and Francis.
- Grabowski, M.; Roberts, K. H. (1996). „Human and organizational error in large scale systems”. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans. 26: 2—16. doi:10.1109/3468.477856.
- Greenbaum, J.; Kyng, M. (1991). Design at work: Cooperative design of computer systems. Lawrence Erlbaum Associates.
- Harrison, M. (2004). Human error analysis and reliability assessment. Workshop on Human Computer Interaction and Dependability, 46th IFIP Working Group 10.4 Meeting, Siena, Italy, July 3–7, 2004.
- Hollnagel, E. (1991). The phenotype of erroneous actions: Implications for HCI design. In G. W. R. Weir and J. L. Alty (Eds.), Human-computer interaction and complex systems. Academic Press.
- Hutchins, E. (1995). Cognition in the wild. MIT Press.
- Kahneman, D.; Slovic, P.; Tversky, A., eds. (1982). Judgment under uncertainty: Heuristics and biases. Cambridge University Press.
- Leveson, N. (1995). Safeware: System safety and computers. Addison-Wesley.
- Morgan, G. (1986). Images of Organization. Sage.
- Mura, S. S. (1983). Licensing violations: Legitimate violations of Grice's conversational principle. In R. Craig and K. Tracy (Eds.), Conversational coherence: Form, structure, and strategy (101-115). Sage.
- Perrow, C. (1984). Normal accidents: Living with high-risk technologies. Basic Books. ISBN 9780465051441.
- Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257-267.
- Rasmussen, J. (1986). Information processing and human-machine interaction: An approach to cognitive engineering. Wiley.
- Silverman, B. (1992). Critiquing human error: A knowledge-based human-computer collaboration approach. Academic Press.
- Swets, J. (1996). Signal detection theory and ROC analysis in psychology and diagnostics: Collected papers. Lawrence Erlbaum Associates.
- Tversky, A.; Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124-1131.
- Vaughan, D. (1996). The Challenger launch decision: Risky technology, culture, and deviance at NASA. University of Chicago Press. ISBN 9780226851761.
- Woods, D. D.; Johannesen, L.; Cook, R.; Sarter, Nadine (1994). Behind human error: Cognitive systems, computers, and hindsight. CSERIAC SOAR Report 94-01. Crew Systems Ergonomics Information Analysis Center, Wright-Patterson Air Force Base, Ohio.
- Wu, S.; Hrudey, S.; French, S.; Bedford, T.; Soane, E.; Pollard, S. (2009). „A role for human reliability analysis (HRA) in preventing drinking water incidents and securing safe drinking water” (PDF). Water Research. 43 (13): 3227—3238. PMID 19493557. doi:10.1016/j.watres.2009.04.040.
- CCPS, Guidelines for Preventing Human Error. This book describes qualitative and quantitative methodologies for predicting human error: the qualitative methodology is SPEAR (Systems for Predicting Human Error and Recovery), and the quantitative methodologies include THERP, among others.
External links
- IEEE Standard 1082 (1997): IEEE Guide for Incorporating Human Action Reliability Analysis for Nuclear Power Generating Stations (archived at the Wayback Machine, 30 October 2017)
- DOE Standard DOE-HDBK-1028-2009 : Human Performance Improvement Handbook
- EPRI HRA Calculator
- Eurocontrol Human Error Tools
- RiskSpectrum HRA software (archived at the Wayback Machine, 3 November 2014)
- Simplified Human Error Analysis Code
- Erik Hollnagel at the Crisis and Risk Research Centre at MINES ParisTech
- Human Reliability Analysis (archived at the Wayback Machine, 15 October 2011) at the US Sandia National Laboratories
- Center for Human Reliability Studies at the US Oak Ridge National Laboratory
- Flight Cognition Laboratory at NASA Ames Research Center
- David Woods at the Cognitive Systems Engineering Laboratory at The Ohio State University
- Sidney Dekker's Leonardo da Vinci Laboratory for Complexity and Systems Thinking, Lund University, Sweden
- "How to Avoid Human Error in IT" (archived at the Wayback Machine, 4 March 2016)
- "Human Reliability. We break down just like machines". Industrial Engineer. 36 (11): 66
- High Reliability Management group at LinkedIn.com